8 research outputs found

    Design for 360-degree 3D light-field camera and display

    No full text
    © 2018 The Author(s). A 360-degree 3D light-field acquisition and display system is proposed. Unlike conventional setups, which record and display rectangular volumes, the proposed configuration captures and displays a cylindrical volume. Application of the display stage to head-up displays is discussed.

    Design of micro photon sieve arrays for high resolution light-field capture in plenoptic cameras

    No full text
    © 2018 IEEE. The design of micro photon sieve arrays (PSAs) is investigated for light-field capture with high spatial resolution in plenoptic cameras. A commercial very high-resolution full-frame camera with a manual lens is converted into a plenoptic camera for high-resolution depth image acquisition by using the designed PSA as an add-on diffractive optical element in place of an ordinary refractive microlens array or a diffractive micro Fresnel Zone Plate (FZP) array, which is used in integral imaging applications. The noise introduced by the diffractive nature of the optical element is reduced by standard image-processing tools. The light-field data is also used for computational refocusing of the 3D scene with wave-propagation tools.
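    The computational refocusing with wave-propagation tools mentioned above is commonly done with the angular-spectrum method: transform the captured complex field, apply a free-space transfer function for the desired refocus distance, and transform back. The sketch below is a generic illustration of that technique; the function name and all parameter values are assumptions, not taken from the paper.

    ```python
    import numpy as np

    def angular_spectrum_propagate(field, wavelength, dx, z):
        """Propagate a sampled complex field a distance z (metres) with the
        angular-spectrum method. Illustrative sketch, not the paper's code."""
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, d=dx)  # spatial frequencies (cycles/m)
        fy = np.fft.fftfreq(ny, d=dx)
        FX, FY = np.meshgrid(fx, fy)
        k = 2.0 * np.pi / wavelength
        # Free-space transfer function; evanescent components are zeroed out.
        arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
        H = np.where(arg > 0, np.exp(1j * k * z * np.sqrt(np.maximum(arg, 0.0))), 0.0)
        return np.fft.ifft2(np.fft.fft2(field) * H)

    # Example: a unit-amplitude plane wave only picks up phase on propagation.
    field = np.ones((128, 128), dtype=complex)
    out = angular_spectrum_propagate(field, wavelength=633e-9, dx=5e-6, z=1e-3)
    ```

    In a refocusing pipeline, `z` is swept over a range of candidate depths and the propagated intensity is inspected (or sharpness-scored) at each plane.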

    A reciprocal 360-degree 3D light-field image acquisition and display system

    No full text
    A reciprocal 360-degree three-dimensional light-field image acquisition and display system was designed using a common catadioptric optical configuration and a lens array. Proof-of-concept experimental setups were constructed with a full capturing part and a truncated display section to demonstrate that the proposed design works without loss of generality. Unlike conventional setups, which record and display rectangular volumes, the proposed configuration records 3D images from its surrounding spherical volume in the capture mode and projects 3D images back to the same spherical volume in the display mode. This is particularly advantageous in comparison to other 360-degree multi-camera and multi-projector display systems, which require extensive image and physical calibration. We analysed the system and derived quality measures, such as angular resolution and space-bandwidth product, from the design parameters. The issue arising from the pixel-size difference between the available imaging sensor and the display was also addressed: a diffractive microlens array matching the sensor size is used in the acquisition part, whereas a vacuum-cast lens array matching the display size is used in the display part with scaled optics. The experimental results demonstrate that the proposed design works well and is in good agreement with the simulation results.
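    For intuition about the quality measures the abstract names, the angular resolution and (one-dimensional) space-bandwidth product of a lenslet-based light-field stage can be estimated from first-order optics. The numbers below are back-of-envelope placeholders, not the paper's design parameters.

    ```python
    # Illustrative first-order estimates for a lenslet-based light-field stage.
    # All numeric values are assumptions, not the paper's parameters.
    pixel_pitch = 8e-6       # pixel pitch behind each lenslet (m)
    lenslet_pitch = 1e-3     # lenslet pitch (m)
    focal_length = 3e-3      # lenslet focal length (m)
    num_lenslets = 500       # lenslets across one dimension of the aperture

    views_per_lenslet = lenslet_pitch / pixel_pitch     # directional samples per lenslet
    angular_resolution = pixel_pitch / focal_length     # rad between adjacent views
    space_bandwidth = num_lenslets * views_per_lenslet  # spatial x angular samples (1D)
    ```

    The trade-off is visible directly: for a fixed sensor, finer angular sampling (more pixels per lenslet) comes at the cost of fewer lenslets, i.e. lower spatial resolution, while the space-bandwidth product stays bounded by the total pixel count.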

    Prospective immersive human-machine interface for future vehicles: Multiple Zones Turn the Full Windscreen Into a Head-Up Display

    No full text
    The physical bottleneck of optical design and the complexity associated with the human visual system (HVS) limit the true potential of the head-up display (HUD) in vehicles. Visual information feedback across the full windscreen is required for driving safety, and implementing it to enhance the driving experience is one of the most significant challenges. We present an immersive augmented-reality (AR) HUD concept to support future vehicle design following a human-centric design approach. The limited field of view of contemporary optical solutions can be overcome by tiling multiple display elements to form images over the entire windscreen and so create an immersive experience. The design takes important human factors into account and improves the operator's driving experience. Furthermore, the images are "distributed": the physical interface generates them via multiple optical apertures/image sources and displays them according to HVS requirements. These configurations are tested in a laboratory environment with a replica of a real car interior and in a prototype vehicle installed with distributed multiple HUD units. The proposed concept of an immersive human-machine interface (HMI) can be further extended in various forms to other parts of the vehicle interior, including surfaces and free space, which we envisage will take place in future car designs.

    Optical reconstruction of transparent objects with phase-only SLMs

    Get PDF
    Three approaches for the visualization of transparent micro-objects from holographic data using phase-only SLMs are described. The objects are silicon micro-lenses captured in the near infrared by means of digital holographic microscopy, and a simulated weakly refracting 3D object with a size in the micrometer range. In the first method, profilometric/tomographic data are retrieved from the captured holograms and converted into a 3D point cloud, which allows computer generation of multi-view phase holograms using the Rayleigh-Sommerfeld formulation. In the second method, the microlens is computationally placed in front of a textured object to simulate the image of the textured data as seen through the lens. In the third method, direct optical reconstruction of the micrometer-scale object through a digital lens is achieved by modifying the phase with the Gerchberg-Saxton algorithm. © 2013 Optical Society of America
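    The Gerchberg-Saxton algorithm used in the third method is an iterative phase-retrieval loop: alternate between the SLM plane (where the amplitude is constrained to unity, since the SLM is phase-only) and the image plane (where the amplitude is constrained to the target), keeping only the phase from each transform. The sketch below illustrates the generic algorithm under stated assumptions; it is not the paper's implementation, and the toy target and error metric are hypothetical.

    ```python
    import numpy as np

    def gerchberg_saxton(target_amp, iterations=50, seed=0):
        """Minimal Gerchberg-Saxton loop for a phase-only hologram:
        SLM plane has unit amplitude and free phase; image plane is
        forced to the target amplitude. Illustrative sketch only."""
        rng = np.random.default_rng(seed)
        phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
        for _ in range(iterations):
            image = np.fft.fft2(np.exp(1j * phase))            # SLM -> image plane
            image = target_amp * np.exp(1j * np.angle(image))  # impose target amplitude
            slm = np.fft.ifft2(image)                          # back to SLM plane
            phase = np.angle(slm)                              # keep phase only
        return phase

    def image_error(phase, target_amp):
        """Normalized mismatch between reconstructed and target amplitude."""
        rec = np.abs(np.fft.fft2(np.exp(1j * phase)))
        rec /= np.linalg.norm(rec)
        return np.linalg.norm(rec - target_amp / np.linalg.norm(target_amp))

    # Toy target: a bright square. The mismatch shrinks as iterations accumulate.
    target = np.zeros((64, 64))
    target[24:40, 24:40] = 1.0
    err_start = image_error(gerchberg_saxton(target, iterations=1), target)
    err_end = image_error(gerchberg_saxton(target, iterations=50), target)
    ```

    The returned phase map is what would be loaded onto the phase-only SLM; the optical Fourier transform performed by a lens then reconstructs the target amplitude in the image plane.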

    Infrared digital holography applications for virtual museums and diagnostics of cultural heritage

    Get PDF
    Infrared digital holograms of different statuettes are acquired. For each object, a sequence of holograms is recorded by rotating the statuette with an angular step of a few degrees. The holograms of the moving objects are used to compose dynamic 3D scenes that are then optically reconstructed by means of spatial light modulators (SLMs) using an illumination wavelength of 532 nm. This kind of reconstruction yields 3D imaging of the statuettes that could be exploited for virtual museums. © 2011 Copyright Society of Photo-Optical Instrumentation Engineers (SPIE)